
    The meaning and experience of well-being in dementia for psychiatrists involved in diagnostic disclosure: a qualitative study

    Literature indicates that people's experiences of receiving a diagnosis of dementia can have a lasting impact on well-being. Psychiatrists frequently lead in communicating a diagnosis, but little is known about the factors that could contribute to potential disparities between actual and best practice in diagnostic disclosure. A clearer understanding of psychiatrists' subjective experiences of disclosure is therefore needed to improve adherence to best practice guidelines and to ensure that diagnostic disclosure facilitates living well with dementia. This study used a qualitative methodology: semi-structured interviews conducted with 11 psychiatrists were analyzed using Interpretative Phenomenological Analysis (IPA). Three superordinate and nine subordinate themes emerged from the analysis: (i) “The levels of well-being” (Continuing with life, Keeping a sense of who they are, Acceptance of the self); (ii) “Living well is a process” (Disclosure can set the scene for well-being, Positive but realistic messages, Whose role is it to support well-being?); and (iii) “Ideal care versus real care” (Supporting well-being is not prioritized, There isn't time, The fragmentation of care). Findings indicate that psychiatrists frame well-being in dementia as a multi-faceted biopsychosocial construct, but that certain nihilistic attitudes may affect how well-being is integrated into diagnostic communication. Such attitudes were linked with the perceived threat of dementia and the limitations of post-diagnostic care. Behaviors used to manage the negative affect associated with the ethical and clinical tensions triggered by attempts to facilitate well-being at the point of diagnosis, and their impact on adherence to best practice disclosure, are discussed.

    Living positively with dementia: a systematic review and synthesis of the qualitative literature

    Objective: Little is known about how, and to what extent, people with dementia live positively with their condition. This study aimed to review and synthesize qualitative studies in which accounts of the subjective experiences of people with dementia contained evidence of positive states, experiences or attributes. Methods: A meta-synthesis was undertaken to generate an integrated and interpretive account of the ability of people with dementia to have positive experiences. A methodological quality assessment was undertaken to maximize the reliability and validity of the synthesis and to contextualize the findings with regard to methodological constraints and epistemological concepts. Findings: Twenty-seven papers were included. Three super-ordinate themes relating to positive experiences and attributes were identified, each with varying and complementary sub-themes. The first super-ordinate theme related to the experience of engaging with life in ageing rather than explicitly to living with dementia. The second related to engaging with dementia itself, and comprised the strengths that people can draw on in facing and fighting the condition. The third captured how people with dementia might transcend the condition, seeking ways to maintain identity and even achieve personal growth. Conclusions: This review provides a first step towards understanding which conceptual domains might be important in defining positive outcomes for people who live with dementia. Highlighting the potential for people to have positive experiences in spite of, or even because of, their dementia has important implications for de-stigmatizing dementia and will enhance person-centred approaches to care.

    A forensically-enabled IaaS cloud computing architecture

    Current cloud architectures do not support digital forensic investigators, nor do they comply with today's digital forensics procedures, largely due to the dynamic nature of the cloud. Whilst much research has focused upon identifying the problems that a cloud-based system introduces, to date there is a significant lack of research on adapting current digital forensic tools and techniques to the cloud environment. Data acquisition is the first and most important process within digital forensics, as it ensures data integrity and admissibility. However, access to data and control of resources in the cloud remain very much provider-dependent, and are complicated by the very nature of the multi-tenanted operating environment. Thus, investigators have no option but to rely on cloud providers to acquire evidence, assuming the providers are willing, or are required by law, to do so. Furthermore, the evidence collected by Cloud Service Providers (CSPs) remains questionable, as there is no way to verify its validity or to establish whether evidence has already been lost. This paper proposes a forensic acquisition and analysis model that fundamentally shifts responsibility for the data back to the data owner rather than relying upon a third party. In this manner, organisations are free to undertake investigations at will, requiring no intervention or cooperation from the cloud provider. The model aims to provide a richer and more complete set of admissible evidence than current CSPs are able to provide.
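
    The model itself is not detailed in the abstract, but its central requirement, evidence whose integrity the data owner can demonstrate without the CSP's cooperation, can be illustrated with a minimal sketch. The Python example below is hypothetical rather than taken from the paper: all names (EvidenceRecord, acquire_artifact, verify_artifact) are invented, and it shows only the standard practice of hashing artefacts at acquisition time so that later tampering is detectable.

        import hashlib
        import json
        import time
        from dataclasses import dataclass, asdict

        @dataclass
        class EvidenceRecord:
            """Chain-of-custody entry for one acquired artefact (hypothetical schema)."""
            source_path: str
            sha256: str
            acquired_at: float

        def acquire_artifact(source_path: str, data: bytes) -> EvidenceRecord:
            # Hash at the moment of acquisition; the owner keeps this record.
            digest = hashlib.sha256(data).hexdigest()
            return EvidenceRecord(source_path, digest, time.time())

        def verify_artifact(record: EvidenceRecord, data: bytes) -> bool:
            # Recompute the digest and compare with the acquisition-time value.
            return hashlib.sha256(data).hexdigest() == record.sha256

        # Usage: the data owner, not the CSP, holds both the data and its proof.
        record = acquire_artifact("vm-42/var/log/auth.log", b"example log contents")
        print(json.dumps(asdict(record), indent=2))
        assert verify_artifact(record, b"example log contents")

    Keeping such records outside the cloud, under the owner's control, is what would allow an investigation to proceed "at will", as the abstract puts it, without provider intervention.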

    Antisemitism in Classical Music: from Wagner to Shostakovich


    An investigation of the microscale geometry and liquid flow through an isolated foam channel network

    Liquid foams represent an extremely diverse and highly functional form of soft matter, with applications widespread throughout industry. These range from luxurious and low-calorie applications in food and beverages, through structural and insulating roles in building and manufacture, to dynamic and transport roles in the petrochemical industry, among others. A common feature of all these foams is that they are required to exhibit longevity; however, as foams are thermodynamically unstable systems, this is not always a trivial feat. Foams are highly complex systems, with dynamic processes occurring on the molecular scale that influence properties at the scale of individual bubbles and, subsequently, at the macroscopic scale of bulk foams. A particular challenge of foam research is to unite these length-scale processes, requiring robust theoretical and experimental studies at all size regimes. This PhD thesis is concerned specifically with the microscale process of liquid flow between bubbles, as these liquid channels form the primary network through which liquid 'drains' through a foam under gravity, one of the key mechanisms governing foam instability. The initial focus of the thesis was the design and implementation of an experimental technique to isolate and image liquid foam channels formed under controlled liquid flow rates. This was developed with a view to producing highly accurate and reproducible measurements of the channel geometries, enabling comparison with the theory derived to describe such systems. Measurements of low-molecular-weight surfactants and higher-molecular-weight emulsifiers clearly demonstrated three previously unseen foam channel geometries that could not be described using existing theory. Instead, a new geometric model was developed to account for these differences, relating the bulk and surface properties of the foam channel to its length and to the rate of liquid flow passing through it. When used as a fitting parameter, the new model clearly demarcated between the characteristically low and high surface viscosities of the surfactant and emulsifier species respectively. The surface viscosity of the surfactant foam channel interfaces was examined throughout the study, as the values extracted from model fitting were consistently lower than the majority found in the literature, yet in line with predictions made by hyper-sensitive measurement techniques. Ultimately, it was proposed that these differences could be attributed to the limited measurement sensitivity of commercial systems, combined with a dependence of the surface surfactant concentration on liquid flow velocity. It was suggested that, for low-molecular-weight surfactants, a surface tension gradient can exist along the length of a foam channel, dependent upon the rate of liquid flow, the concentration of surfactant, and the rate of surfactant adsorption to the interface. At high liquid flow velocities, it was shown that surface tensions in some channel regions could be almost as high as that of pure water, despite surfactant concentrations being above the CMC. This could have significant consequences for the stability of macroscopic foams in which these conditions are present.
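
    As background to the geometric model described above (which the abstract does not reproduce), the drainage literature commonly describes gravity-driven flow through a single foam channel (Plateau border) of cross-sectional area A with a Poiseuille-like relation, and captures wall mobility with the Boussinesq number. The notation below follows that general literature, not necessarily the thesis:

        \[
          \bar{u} \approx \frac{\rho\, g\, A}{f\,\eta},
          \qquad
          \mathrm{Bq} = \frac{\mu_s}{\eta\,\sqrt{A}},
        \]

    where \(\bar{u}\) is the mean liquid velocity along the channel, \(\rho\) and \(\eta\) are the liquid density and bulk viscosity, \(g\) is gravitational acceleration, and \(f\) is a dimensionless drag coefficient set by the mobility of the channel walls (of order 50 for rigid, high-surface-viscosity interfaces, smaller for mobile ones). The Boussinesq number \(\mathrm{Bq}\) compares the surface viscosity \(\mu_s\) to bulk viscous stresses over the channel width, which is why a fitted surface viscosity can discriminate between high-surface-viscosity emulsifier interfaces and low-surface-viscosity surfactant interfaces.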

    Preferences and Positivist Methodology in Economics

    I distinguish several doctrines that economic methodologists have found attractive, all of which have a positivist flavour. One of these is the doctrine that preference assignments in economics are just shorthand descriptions of agents' choice behaviour. Although most of these doctrines are problematic, this latter doctrine about preference assignments is, I argue, a respectable one: it doesn't entail any of the problematic doctrines, and indeed it is warranted independently of them.

    Dynamic motion coupling of body movement for input control

    Touchless gestures are used for input when touch is unsuitable or unavailable, such as when interacting with displays that are remote, large, or public, or when touch is prohibited for hygienic reasons. Traditionally, user input is spatially or semantically mapped to system output; in the context of touchless gestures, however, these interaction principles suffer from several disadvantages, including poor memorability, fatigue, and ill-defined mappings. This thesis investigates motion correlation as the third interaction principle for touchless gestures, mapping user input to system output based on spatiotemporal matching of reproducible motion. We demonstrate the versatility of motion correlation by using movement as the primary sensing principle, relaxing the restrictions on how a user provides input. First, using TraceMatch, a novel computer-vision-based system, we show how users can provide effective input, through an investigation of input performance with different parts of the body, and how users can switch modes of input spontaneously in realistic application scenarios. Secondly, spontaneous spatial coupling shows how motion correlation can bootstrap spatial input, allowing any body movement, or movement of tangible objects, to be appropriated for ad hoc touchless pointing on a per-interaction basis. We operationalise the concept in MatchPoint, and demonstrate its unique capabilities through an exploration of the design space with application examples. Thirdly, we explore how users synchronise with moving targets in the context of motion correlation, revealing how simple harmonic motion leads to better synchronisation. Using the insights gained, we explore the robustness of the algorithms used for motion correlation, showing how it is possible to successfully detect a user's intent to interact whilst suppressing accidental activations from common spatial and semantic gestures. Finally, we look across our work to distil guidelines for interface design, and further considerations of how motion correlation can be used, both in general and for touchless gestures.
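
    The matching step at the heart of motion correlation is commonly described as a windowed correlation between the trajectory of a displayed target and the trajectory of tracked user motion. The Python sketch below is a generic illustration of that principle, not the thesis's TraceMatch or MatchPoint implementation; the per-axis Pearson test and the threshold value are illustrative assumptions.

        import numpy as np

        def pearson(a: np.ndarray, b: np.ndarray) -> float:
            """Pearson correlation of two equal-length 1-D trajectories."""
            a = a - a.mean()
            b = b - b.mean()
            denom = np.sqrt((a * a).sum() * (b * b).sum())
            return float((a * b).sum() / denom) if denom > 0 else 0.0

        def matches_target(input_xy: np.ndarray, target_xy: np.ndarray,
                           threshold: float = 0.8) -> bool:
            """Declare a selection when both axes of the user's motion
            correlate with the target's motion over the analysis window."""
            cx = pearson(input_xy[:, 0], target_xy[:, 0])
            cy = pearson(input_xy[:, 1], target_xy[:, 1])
            return min(cx, cy) > threshold

        # Usage: a circularly moving target, and a user roughly following it.
        t = np.linspace(0, 2 * np.pi, 120)                # one period, 120 frames
        target = np.column_stack((np.cos(t), np.sin(t)))
        user = target + np.random.default_rng(0).normal(0, 0.1, target.shape)
        print(matches_target(user, target))               # True for a good follow

    A real system would run this over a sliding window for every on-screen target and trigger the target whose motion best matches sustained user input, which is also where the abstract's concern with suppressing accidental activations arises.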

    Multi-Level Selection and the Explanatory Value of Mathematical Decompositions

    Do multi-level selection explanations of the evolution of social traits deepen the understanding provided by single-level explanations? Central to the former is a mathematical theorem, the multi-level Price decomposition. I build a framework through which to understand the explanatory role of such non-empirical decompositions in scientific practice. Applying this general framework to the present case places two tasks on the agenda. The first is to distinguish the various ways of suppressing within-collective variation in fitness, and to evaluate their biological interest. I distinguish four such ways: increasing retaliatory capacity, homogenising assortment, and collapsing either the fitness structure or the character distribution to a mean value. The second task is to discover whether the third term of the Price decomposition measures the effect of any of these hypothetical interventions. On this basis I argue that the multi-level Price decomposition has explanatory value primarily when the sharing-out of collective resources is 'subtractable'; its value is thus more circumscribed than its champions Sober and Wilson (1998) suppose.
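
    For readers unfamiliar with the theorem, one common statement of the multi-level Price decomposition (notation and term ordering vary across authors, and may differ from the paper's) partitions the change in the population mean character \(\bar{z}\) as

        \[
          \bar{w}\,\Delta\bar{z}
          \;=\; \operatorname{Cov}_k\!\big(W_k, Z_k\big)
          \;+\; \operatorname{E}_k\!\big[\operatorname{Cov}_j\big(w_{jk}, z_{jk}\big)\big]
          \;+\; \operatorname{E}\!\big[w_{jk}\,\Delta z_{jk}\big],
        \]

    where \(k\) indexes collectives, \(j\) indexes individuals within a collective, lower-case letters denote individual fitness and character, and capitals denote their collective averages. Under this ordering, the within-collective covariance is the component that the interventions discussed above would drive to zero, since each suppresses variation in fitness within collectives.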